Search results for "Dimension reduction"

Showing 10 of 15 documents

Efficient unsupervised clustering for spatial bird population analysis along the Loire river

2015

This paper focuses on the application and comparison of Non-Linear Dimensionality Reduction (NLDR) methods on a natural, high-dimensional bird-community dataset along the Loire River (France). In this context, biologists usually use the well-known PCA to explain the upstream-downstream gradient. Unfortunately, this method was unsuccessful on this kind of nonlinear dataset. The goal of this paper is to compare recent NLDR methods, coupled with different data transformations, in order to find the best approach. Results show that Multiscale Jensen-Shannon Embedding (Ms JSE) outperforms all other methods in this context.
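
The contrast the abstract describes can be sketched with off-the-shelf tools. A minimal illustration, not the paper's pipeline: Ms JSE is not available in scikit-learn, so Isomap stands in as the nonlinear method, and a synthetic S-curve stands in for the bird-community data, with its intrinsic coordinate `t` playing the role of the upstream-downstream gradient.

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# Stand-in data: an S-curve with a known one-dimensional nonlinear gradient t
X, t = make_s_curve(n_samples=500, random_state=0)

# Linear baseline vs. a nonlinear method (Isomap stands in for Ms JSE)
pca_1d = PCA(n_components=1).fit_transform(X).ravel()
iso_1d = Isomap(n_neighbors=10, n_components=1).fit_transform(X).ravel()

# How well does each one-dimensional summary preserve the gradient?
r_pca = abs(np.corrcoef(pca_1d, t)[0, 1])
r_iso = abs(np.corrcoef(iso_1d, t)[0, 1])
```

The correlations `r_pca` and `r_iso` quantify how much of the nonlinear gradient each one-dimensional summary retains; on data like this the geodesic-based embedding tracks `t` very closely.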

Clustering algorithms; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; nonlinear dimension reduction; Multiscale Jensen-Shannon Embedding; dimension reduction; Loire River

Approximation of functions over manifolds : A Moving Least-Squares approach

2021

We present an algorithm for approximating a function defined over a $d$-dimensional manifold, utilizing only noisy function values at locations sampled with noise from the manifold. To produce the approximation we do not require any knowledge regarding the manifold other than its dimension $d$. We use the Manifold Moving Least-Squares approach of Sober and Levin (2016) to reconstruct the atlas of charts, and the approximation is built on top of those charts. The resulting approximant is shown to be a function defined over a neighborhood of a manifold, approximating the originally sampled manifold. In other words, given a new point, located near the manifold, the approximation can be evaluated…
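
The moving least-squares idea underlying the method can be illustrated in one dimension — a simplified sketch with assumed bandwidth and degree, not the manifold version of the paper: each query point gets its own polynomial fit, weighted so that samples far from the query contribute little.

```python
import numpy as np

def mls_approx(x_query, x_samples, y_samples, h=0.4, degree=2):
    # Gaussian weights centered at the query point (bandwidth h is assumed)
    w = np.exp(-((x_samples - x_query) / h) ** 2)
    # Local weighted least-squares polynomial fit
    A = np.vander(x_samples, degree + 1)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y_samples, rcond=None)
    return np.polyval(coef, x_query)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.05, x.size)   # noisy samples of a smooth function
y_hat = np.array([mls_approx(q, x, y) for q in (1.0, 2.0, 3.0)])
```

Each evaluation solves a small weighted regression, so the approximant is smooth in the query point and robust to the sampling noise.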

Computational Geometry (cs.CG); Machine Learning (cs.LG); Machine Learning (stat.ML); Graphics (cs.GR); closed manifold; dimension reduction; manifold learning; manifold alignment; atlas (topology); high-dimensional approximation; regression over manifolds; out-of-sample extension; moving least-squares; numerical analysis; manifolds; approximation; applied mathematics; computational mathematics; topology

Dimension Estimation in Two-Dimensional PCA

2021

We propose an automated way of determining the optimal number of low-rank components in dimension reduction of image data. The method is based on the combination of two-dimensional principal component analysis and an augmentation estimator proposed recently in the literature. Intuitively, the main idea is to combine a scree plot with information extracted from the eigenvectors of a variation matrix. Simulation studies show that the method provides accurate estimates, and a demonstration with a finger data set showcases its performance in practice.
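
A rough sketch of the 2D-PCA ingredient, with a hypothetical low-rank image stack and a simple cumulative-variance cutoff standing in for the augmentation estimator proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stack of 100 images (20x20) sharing a rank-2 row space, plus noise
F = rng.normal(size=(2, 20))
images = np.array([rng.normal(size=(20, 2)) @ F + 0.01 * rng.normal(size=(20, 20))
                   for _ in range(100)])

mean_img = images.mean(axis=0)
centered = images - mean_img
# 2D-PCA variation matrix: average of (X - Xbar)^T (X - Xbar) over the sample
G = np.mean([X.T @ X for X in centered], axis=0)
eigvals = np.linalg.eigh(G)[0][::-1]          # eigenvalues, descending
# Scree-style choice: smallest k explaining 95% of total variation
ratio = np.cumsum(eigvals) / eigvals.sum()
k = int(np.searchsorted(ratio, 0.95) + 1)
```

On this synthetic stack the spectrum has two dominant eigenvalues, so the cutoff recovers the true rank.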

dimension reduction; dimensionality reduction; image data; estimator; pattern recognition; dimension estimation; data modeling; data set; matrix; scree plot; principal component analysis; augmentation; eigenvalues and eigenvectors

Dimensional reduction for energies with linear growth involving the bending moment

2008

A $\Gamma$-convergence analysis is used to perform a 3D-2D dimension reduction of variational problems with linear growth. The adopted scaling gives rise to a nonlinear membrane model which, because of the presence of higher order external loadings inducing a bending moment, may depend on the average in the transverse direction of a Cosserat vector field, as well as on the deformation of the mid-plane. The assumption of linear growth on the energy leads to an asymptotic analysis in the spaces of measures and of functions with bounded variation.
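
The 3D-2D rescaling behind such an analysis can be written in its generic form (the standard membrane scaling; the functional studied here additionally carries the higher-order loadings and the Cosserat field):

```latex
% Thin domain of thickness \varepsilon over the mid-plane \omega \subset \mathbb{R}^2
E_\varepsilon(u) = \frac{1}{\varepsilon} \int_{\Omega_\varepsilon} W(\nabla u)\, dx,
\qquad \Omega_\varepsilon = \omega \times \Bigl(-\tfrac{\varepsilon}{2}, \tfrac{\varepsilon}{2}\Bigr).
% Rescaling x_3 = \varepsilon y_3 maps the problem onto the fixed domain
% \Omega = \omega \times (-\tfrac12, \tfrac12):
E_\varepsilon(u) = \int_{\Omega} W\Bigl(\nabla_\alpha u \,\Big|\, \tfrac{1}{\varepsilon}\,\partial_3 u\Bigr)\, dy,
% and the \Gamma-limit as \varepsilon \to 0 is a membrane energy on \omega.
```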

asymptotic analysis; dimension reduction; tangent measures; bending moments; Γ-convergence; scaling; functions of bounded variation; vector field; deformation (mechanics); Analysis of PDEs (math.AP); MSC: 49J45, 49Q20, 74K35

3D-2D dimensional reduction for a nonlinear optimal design problem with perimeter penalization

2012

A 3D-2D dimension reduction for a nonlinear optimal design problem with a perimeter penalization is performed in the realm of $\Gamma$-convergence, providing an integral representation for the limit functional.

optimal design; mathematical optimization; integral representation; dimension reduction; perimeter; nonlinear system; applied mathematics; Analysis of PDEs (math.AP)

Data-driven analysis for fMRI during naturalistic music listening

2017

Interest in higher ecological validity in functional magnetic resonance imaging (fMRI) experiments has been growing steadily since the turn of the millennium. The trend is reflected in an increasing number of naturalistic experiments, in which participants are exposed to complex real-world stimuli and/or cognitive tasks such as watching a movie, playing video games, or listening to music. Multifaceted stimuli forming parallel streams of input information, combined with reduced control over experimental variables, introduce a number of methodological challenges associated with isolating brain responses to individual events. This exploratory work demonstrated some of those methodological challeng…

PCA; fMRI; dimension reduction; music; signal analysis; cognitive processes; listening; principal component analysis; naturalistic experiment; functional magnetic resonance imaging; ICA; CCA; brain; kernel PCA

Snowball ICA: A Model Order Free Independent Component Analysis Strategy for Functional Magnetic Resonance Imaging Data

2020

In independent component analysis (ICA), the selection of the model order (i.e., the number of components to be extracted) has crucial effects on functional magnetic resonance imaging (fMRI) brain network analysis. Model order selection (MOS) algorithms have been used to determine the number of estimated components. However, simulations show that even when the model order equals the number of simulated signal sources, traditional ICA algorithms may misestimate the spatial maps of the signal sources. In principle, increasing the model order will consider more potential information in the estimation, and should therefore produce more accurate results. However, this strategy may not work for fMRI because …
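
The effect of model order can be probed with a toy experiment — a sketch using scikit-learn's FastICA on synthetic sources, not the Snowball ICA algorithm or fMRI data: run ICA at several model orders and record, for each true source, its best absolute correlation with an estimated component.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
t = np.linspace(0, 8, 2000)
# Three hypothetical non-Gaussian sources
S = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t) ** 3, rng.laplace(size=t.size)]
A = rng.normal(size=(6, 3))                    # mixing into 6 observed channels
X = S @ A.T + 0.01 * rng.normal(size=(t.size, 6))

def best_match_corr(S_true, S_est):
    # For each true source, the best absolute correlation among the estimates
    p = S_true.shape[1]
    C = np.corrcoef(S_true.T, S_est.T)[:p, p:]
    return np.abs(C).max(axis=1)

recovered = {}
for k in (2, 3, 5):
    S_est = FastICA(n_components=k, random_state=0, max_iter=1000).fit_transform(X)
    recovered[k] = best_match_corr(S, S_est)
```

At the true order (k=3) all sources are recovered almost perfectly; under- and over-specified orders can be compared via the same score.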

dimension reduction; functional magnetic resonance imaging; independent component analysis; model order; mutual information; signal processing; signal analysis; pattern recognition; noise (signal processing); neuroscience; Frontiers in Neuroscience

Exploring regression structure with graphics

1993

We investigate the extent to which it may be possible to carry out a regression analysis using graphics alone, an idea that we refer to as graphical regression. The limitations of this idea are explored. It is shown that graphical regression is theoretically possible with essentially no constraints on the conditional distribution of the response given the predictors, but with some conditions on the marginal distribution of the predictors. Dimension reduction subspaces and added variable plots play a central role in the development. The possibility of useful methodology is explored through two examples.
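
The added-variable plot mentioned above can be computed directly; by the Frisch-Waugh-Lovell theorem, its slope equals the corresponding multiple-regression coefficient. A small numpy sketch with simulated predictors (the data and coefficient values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
X = rng.normal(size=(n, 3))
beta = np.array([1.5, -2.0, 0.5])
y = X @ beta + rng.normal(size=n)

def resid(Z, v):
    # Residuals of v after least-squares regression on the columns of Z
    coef, *_ = np.linalg.lstsq(Z, v, rcond=None)
    return v - Z @ coef

def added_variable(y, X, j):
    # Residuals of y and of predictor j, regressing out the other predictors
    Z = np.c_[np.ones(len(y)), np.delete(X, j, axis=1)]
    r_y, r_x = resid(Z, y), resid(Z, X[:, j])
    slope = r_x @ r_y / (r_x @ r_x)
    return r_x, r_y, slope

full = np.linalg.lstsq(np.c_[np.ones(n), X], y, rcond=None)[0]
slopes = np.array([added_variable(y, X, j)[2] for j in range(3)])
```

Plotting `r_y` against `r_x` for each predictor gives the graphical diagnostic; the identity `slopes == full[1:]` is what makes the plot interpretable as a regression coefficient.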

statistics and probability; polynomial regression; econometrics; sufficient dimension reduction; partial regression plot; regression analysis; cross-sectional regression; conditional probability distribution; marginal distribution; segmented regression; mathematics

On the usage of joint diagonalization in multivariate statistics

2022

Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including well-known principal component analysis (PCA), which is based on the diagonalization of the covariance matrix. The simultaneous diagonalization of two or more scatter matrices goes beyond PCA and is used more and more often. In this paper, we offer an overview of many methods that are based on a joint diagonalization. These methods range from the unsupervised context with invariant coordinate selection and blind source separation, which includes independent component analysis, to the supervised context with discriminant analysis and sliced inverse regression. They also enco…
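
A minimal instance of joint diagonalization is FOBI-style independent component analysis: whiten with the covariance matrix, then eigendecompose a fourth-moment scatter of the whitened data, so the resulting transform diagonalizes both scatters at once. A sketch with synthetic mixed sources (assumed data, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)
# Three independent sources with distinct kurtoses, linearly mixed
S = np.c_[rng.uniform(-1, 1, 1000), rng.laplace(size=1000), rng.normal(size=1000)]
X = S @ rng.normal(size=(3, 3)).T

Xc = X - X.mean(axis=0)
S1 = np.cov(Xc, rowvar=False)                    # first scatter: covariance
vals, vecs = np.linalg.eigh(S1)
W = vecs @ np.diag(vals ** -0.5) @ vecs.T        # symmetric whitener S1^{-1/2}
Z = Xc @ W

r2 = np.sum(Z ** 2, axis=1)                      # squared radius of whitened points
S2 = (Z * r2[:, None]).T @ Z / len(Z)            # second scatter: fourth moments (FOBI)
U = np.linalg.eigh(S2)[1]
components = Z @ U                               # diagonalizes S1 and S2 simultaneously
```

The extracted `components` have identity covariance and diagonalize the fourth-moment scatter, which is exactly the simultaneous-diagonalization property the overview describes.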

statistics and probability; scatter matrices; multivariate statistics; blind source separation; sliced inverse regression; supervised dimension reduction; numerical analysis; covariance matrix; pattern recognition; independent component analysis; mathematical statistics; linear discriminant analysis; invariant coordinate selection; multivariate methods; principal component analysis; dimension reduction

Dimension reduction for time series in a blind source separation context using R

2021

Funding Information: The work of KN was supported by the CRoNoS COST Action IC1408 and the Austrian Science Fund P31881-N32. The work of ST was supported by the CRoNoS COST Action IC1408. The work of JV was supported by the Academy of Finland (grant 321883). We would like to thank the anonymous reviewers for their comments, which improved the paper and package considerably. Publisher Copyright: © 2021, American Statistical Association. All rights reserved.

Multivariate time series observations are increasingly common in multiple fields of science, but the complex dependencies of such data often translate into intractable models with a large number of parameters. An alternative is given by first red…
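
The "first reduce, then model" idea can be sketched with an AMUSE-style second-order blind source separation step — illustrated here in Python with numpy rather than the R package the article describes: whiten with the covariance matrix, then diagonalize a symmetrized lag-1 autocovariance of the whitened series.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3000
# Hypothetical latent series with distinct autocorrelation structure
s1 = np.sin(np.arange(n) * 0.05)                                  # slow oscillation
s2 = np.convolve(rng.normal(size=n + 5), np.ones(5) / 5, mode="valid")[:n]  # MA(5)
s3 = rng.normal(size=n)                                           # white noise
S = np.c_[s1, s2, s3]
X = S @ rng.normal(size=(3, 3)).T                                 # observed mixtures

Xc = X - X.mean(axis=0)
C0 = np.cov(Xc, rowvar=False)
vals, vecs = np.linalg.eigh(C0)
Z = Xc @ vecs @ np.diag(vals ** -0.5) @ vecs.T                    # whitened series

tau = 1
C_tau = Z[:-tau].T @ Z[tau:] / (len(Z) - tau)
C_sym = (C_tau + C_tau.T) / 2                 # symmetrize before eigendecomposition
U = np.linalg.eigh(C_sym)[1]
sources = Z @ U                               # estimated latent series
```

Because the three latent series have well-separated lag-1 autocorrelations, the eigendecomposition recovers them up to sign and order; subsequent modeling can then proceed on the low-dimensional `sources`.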

statistics and probability; stochastic volatility; blind source separation; supervised dimension reduction; R; signal processing; signal analysis; dimensionality reduction; covariance; time series analysis; multivariate methods; time series; algorithm; software